Dechert Cyber Bits - Issue 19 | Dechert LLP - JDSupra


CJEU: Special Category Data Just Got More Complicated

On August 1, 2022, the Court of Justice of the European Union (“CJEU”) delivered a preliminary ruling on the legal interpretation of special categories of personal data under the European General Data Protection Regulation ("EU GDPR").

The case concerns a director (“OT”) of a Lithuanian organization which received public funding. Because of his position, under Lithuanian anti-corruption law the director was obliged to disclose his private interests via a declaration with the Chief Ethics Commission (“CEC”), including information such as ID numbers, social security numbers, details of his spouse/partner and transactions carried out in the last year exceeding €3,000. Except for ID numbers, social security numbers and ‘special personal data’, this information is published on the CEC’s website.

The referring Lithuanian court was concerned that details as to OT’s spouse/partner were capable of being used to make inferences about sexual orientation. The CJEU effectively found that information from which sexual orientation can be indirectly inferred is protected as special category data.

The CJEU also held that, while there is a significant public interest in avoiding corruption that justifies the disclosure of private interests to the CEC, the obligation to publish this information on the CEC’s website was disproportionate and exposed OT to a number of risks, including targeted advertising and even criminal behavior. In the Court’s view, publication to a smaller group of people would have been equally effective.

Takeaway: Although the court’s ruling relates specifically to sexual orientation data, CJEU judgments need to be read not in a fact-specific manner but by looking at the application of underlying principles. Therefore, it is clear that the CJEU is prepared to take an expansive approach which is very likely to apply across all special categories of data. Businesses need to proactively review their processing and consider whether any special categories of data may be inferred from the information they hold.

While the ADPPA Remains Under Consideration in the House, Two Children’s Privacy Bills Advance in the Senate

As reported in the last issue of Cyber Bits, the House Committee on Energy & Commerce recently voted to refer an amended version of the American Data Privacy and Protection Act (“ADPPA”) to the full House of Representatives. But the ADPPA faces significant hurdles in the Senate: Senator Maria Cantwell (D-Wash.) (Chair of the Senate Commerce Committee) recently stated that she is not planning for the Committee to consider the ADPPA, effectively blocking the legislation in the Senate. Instead, Senate Democrats have been focused on expanding protections for children with an updated version of the Children’s Online Privacy Protection Act (“COPPA 2.0”) and the new Kids Online Safety Act (“KOSA”). Both bills were voted out of the Commerce Committee and are headed to the Senate floor for a full vote.

Passed in 1998, COPPA regulates the collection of personal information directly from children under the age of 13. COPPA 2.0 would establish additional protections and expand portions of COPPA to older children. If enacted, the measure would leave intact COPPA’s definition of “child” as a person age 12 or under but add a new definition to classify children age 13 through 16 (or 15 in some drafts) as “minors.” The amended statute would build on COPPA by limiting the collection of personal information from “minors.” Other key provisions would ban targeted marketing to “minors;” require companies to allow parents and “minors” to delete a “minor’s” personal information; and create a Youth Privacy and Marketing Division within the FTC to address online marketing and privacy issues related to children and “minors.”
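The age bands described above can be sketched as a simple classification rule. This is a hypothetical illustration only; the upper bound of the “minor” band varies across drafts (16 in some, 15 in others), and the statutory definitions are more nuanced than a single age check.

```python
def coppa_category(age: int) -> str:
    """Classify a user into the age bands described in COPPA 2.0 drafts.

    Hypothetical sketch: under the proposal as described, a "child"
    remains a person age 12 or under, while ages 13 through 16 would
    form a new "minor" band subject to additional protections.
    """
    if age <= 12:
        return "child"   # existing COPPA protections apply
    elif age <= 16:
        return "minor"   # new COPPA 2.0 protections would apply
    else:
        return "adult"   # outside the statute's scope
```

A compliance system would use such a rule to decide, for example, whether targeted marketing is banned or whether deletion rights extend to the user.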

While COPPA tackles data collection, KOSA addresses platform design—specifically concerns about how platforms may harm young users. To combat what some contend are the negative effects of social media, KOSA would create a duty for social media platforms to prevent and mitigate harm to minors, including content encouraging eating disorders, suicide, substance abuse, and self-harm (though what exactly that mitigation involves remains vague). The legislation would also require platforms, among other things, to automatically apply the strictest privacy settings to minors, and engage in annual independent audits to evaluate compliance and risks to minors.

Takeaway: The Senate Commerce Committee’s approval of COPPA 2.0 and KOSA theoretically moves these bills closer to the finish line. However, significant hurdles remain. Congress now faces a choice between comprehensive federal privacy legislation (the ADPPA) and targeted privacy legislation (COPPA 2.0 and/or KOSA), the latter of which may have a greater likelihood of approval. The progress of these bills should be closely monitored. In light of the underlying policy goals of KOSA and COPPA 2.0, companies will want to be mindful of how they target or collect data from children and minors under the age of 17.

Criteo Faces €60m Fine for GDPR Violations in France

The French data protection authority, the Commission Nationale de l’Informatique et des Libertés (“CNIL”), has reportedly issued a preliminary decision finding French AdTech giant Criteo to be in breach of the European General Data Protection Regulation ("EU GDPR") and imposing a fine of €60 million (approximately USD 61.03 million) following a multi-year investigation.

A formal complaint was initially filed by Privacy International, a UK-based privacy advocacy group, in 2018, claiming Criteo operated a “manipulation machine” by using a range of tracking techniques and data processing practices designed to build profiles of internet users in order to target them with behavioral advertising without any legal basis under the GDPR. The CNIL investigation also looked into complaints made by noyb (the Max Schrems-fronted privacy activism organization).

The CNIL issued its preliminary decision to Criteo, which released a statement on its website disputing both the merits of the assertions of non-compliance and the quantum of the fine. Criteo may make representations to the CNIL, after which there will be a formal hearing before the decision is finalized; the final decision is not expected until 2023.

Takeaway: This latest decision (even if only preliminary at this stage) demonstrates the increased scrutiny of AdTech in the EU, following the Belgian authority’s finding against IAB Europe earlier this year, and the suite of legislative proposals under the EU’s digital and data strategies affecting AdTech operations. The size of the proposed fine also indicates that the AdTech sector is a top enforcement priority for EU data protection authorities. AdTech companies may want to re-evaluate their targeted advertising strategies in light of the proposed legislation.

UK ICO Publishes Updated Guidance on Binding Corporate Rules

On July 25, 2022, the UK Information Commissioner’s Office (“ICO”) published updated guidance for the approval of binding corporate rules (“BCRs”) which aims to simplify the application process.

BCRs serve as an available mechanism for making transfers of personal data to third countries which would otherwise be restricted. BCRs originated under EU law and continue to be part of UK law under the UK General Data Protection Regulation (“UK GDPR”). They are for use by multinational company groups and comprise internal rules or policies, which are legally binding and enforceable. BCRs must be approved by the ICO to be a valid transfer mechanism, and the approval process is usually lengthy.

In recognition of the fact that BCR applicants may seek BCRs both in the EU and the UK and that there are currently overlaps in both jurisdictions, the ICO has updated its guidance to simplify the approval process. The updated guidance also reflects the Schrems II judgment issued by the Court of Justice of the European Union (“CJEU”), which requires additional steps to use BCRs as a transfer mechanism, such as conducting a data transfer impact assessment to ensure sufficient protection of personal data.

Takeaway: Businesses wishing to complete a BCR application under the UK GDPR must now use the UK ICO’s updated application forms and tables, referring to the new guidance. The updates should help provide clarity and, hopefully, a quicker timeframe to approval.

FCC Warns Consumers of Growing Robotext Threat

The Federal Communications Commission (“FCC”) recently released an alert to consumers regarding the rising threat of robotexts. While the FCC does not track call or text volume, it has seen a dramatic increase in the number of complaints about unwanted text messages, growing from approximately 5,700 in 2019 to 15,300 in 2021. And the FCC estimates that it has already received as many as 8,500 complaints this year through June 2022. Independent reports, such as RoboKiller, estimate that consumers received over 12 billion robotexts in June 2022 alone. This increase is in contrast to the number of spam robocalls, which appears to have fallen by nearly 50% since the FCC required many phone companies to install spam-blocking technology last year.

Robotext scams, like robocall scams, use fear and anxiety to deceive consumers. Text messages may include false but believable claims about bank account issues, package delivery mistakes, unpaid bills, or potential law enforcement actions. Additionally, scam artists often provide confusing or incomplete information to spur curiosity and engagement. While many robotext scams are after money, others attempt to collect personal information or simply confirm that a telephone number is active for use in future scams.

In response to the growing robotext threat, the FCC is partnering with state Attorneys General to pool investigation resources to combat robocalls and robotexts more effectively. Attorneys General from all 50 states also recently launched an Anti-Robocall Litigation Task Force to investigate and prosecute telecom companies that bring foreign robocalls into the United States. FCC Chair Jessica Rosenworcel has also proposed updating robotext rules to require mobile phone companies to block illegal robotexts and to consider whether authentication technology similar to caller ID might be applied to text messaging. This rulemaking proposal remains pending before the full Commission.

Takeaway: The FCC and State AGs are currently focused on robocalls and robotexts, potentially heightening enforcement risks for violations of relevant statutes. Businesses must ensure that their telephone and SMS/text marketing practices are compliant with applicable laws. Notably, the FCC considers robotexts to be in the same category as robocalls under the TCPA. This means that entities responsible for texts must, among other things, not contact consumers before 8 AM or after 9 PM and avoid sending unsolicited text messages to recipients on the National Do Not Call Registry.
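The quiet-hours rule mentioned above (no marketing calls or texts before 8 AM or after 9 PM, in the recipient’s local time) can be sketched as a simple pre-send check. This is a minimal, hypothetical illustration, not a full compliance implementation: a real system must also screen against the National Do Not Call Registry, honor consent and opt-out records, and determine the recipient’s local time zone correctly.

```python
from datetime import time

QUIET_START = time(21, 0)  # 9 PM local time: quiet hours begin
QUIET_END = time(8, 0)     # 8 AM local time: quiet hours end

def within_permitted_hours(local_time: time) -> bool:
    """Return True if a marketing text may be sent at the recipient's
    local time under the TCPA quiet-hours rule (8 AM to 9 PM).

    Hypothetical sketch only; DNC Registry screening and consent
    checks are omitted here.
    """
    return QUIET_END <= local_time < QUIET_START
```

A send pipeline would call this check, keyed to each recipient’s time zone, before dispatching any message.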

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Dechert LLP | Attorney Advertising

